Asymptotic distributions associated to Oja's learning equation for neural networks

Authors

  • Jean Pierre Delmas
  • Jean-François Cardoso
Abstract

In this paper, we perform a complete asymptotic performance analysis of the stochastic approximation algorithm (denoted the subspace network learning (SNL) algorithm) derived from Oja's learning equation, in the case where the learning rate is constant and a large number of patterns is available. This algorithm drives the connection weight matrix W to an orthonormal basis of a dominant invariant subspace of a covariance matrix. Our approach consists in associating with this algorithm a second stochastic approximation algorithm that governs the evolution of WW^T to the projection matrix onto this dominant invariant subspace. Then, using a general result of Gaussian approximation theory, we derive the asymptotic distribution of the estimated projection matrix. Closed-form expressions for the asymptotic covariance of the projection matrix estimated by the SNL algorithm, and by the smoothed SNL algorithm that we introduce, are given in the case of independent or correlated learning patterns and are further analyzed. It is found that the structures of these asymptotic covariance matrices are similar to those describing batch estimation techniques. The accuracy of our asymptotic analysis is checked by numerical simulations, and it is found to be valid not only for a "small" learning rate but over a very large domain. Finally, improvements brought by our smoothed SNL algorithm are shown, such as the learning speed/misadjustment tradeoff and the deviation from orthonormality.
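The abstract does not reproduce the SNL recursion itself; as a rough illustration only, the sketch below implements the classical Oja subspace rule W ← W + γ(x yᵀ − W y yᵀ) with y = Wᵀx, i.e., the constant-learning-rate recursion of the kind the paper analyzes. The dimensions, eigenvalue profile, step size γ, and iteration count are arbitrary choices made here for demonstration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n, r = 8, 2       # input dimension, dimension of the dominant subspace
gamma = 0.005     # constant learning rate (illustrative choice)

# Synthetic covariance C with a clearly dominant 2-D invariant subspace.
eigvals = np.array([10.0, 8.0, 1.0, 0.8, 0.5, 0.3, 0.2, 0.1])
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))  # random orthonormal basis
C = Q @ np.diag(eigvals) @ Q.T
A = np.linalg.cholesky(C)  # draw patterns x = A z with covariance C

W = 0.1 * rng.standard_normal((n, r))
for _ in range(40000):
    x = A @ rng.standard_normal(n)        # learning pattern
    y = W.T @ x
    # Oja subspace update: W += gamma * (x y^T - W y y^T)
    W += gamma * (np.outer(x, y) - W @ np.outer(y, y))

# W W^T should be close to the projector onto the dominant subspace,
# up to stochastic fluctuations of order gamma (the "misadjustment").
P_hat = W @ W.T
P = Q[:, :r] @ Q[:, :r].T                 # true dominant projector
print(np.linalg.norm(P_hat - P))          # small residual
print(np.linalg.norm(W.T @ W - np.eye(r)))  # deviation from orthonormality
```

Note that, as the abstract emphasizes, it is the projector estimate WWᵀ (not W itself) that admits a clean asymptotic distribution, which is why the two printed residuals are the natural quantities to monitor.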


Similar references

Asymptotic distributions of Neumann problem for Sturm-Liouville equation

In this paper we apply the homotopy perturbation method to derive the higher-order asymptotic distribution of the eigenvalues and eigenfunctions associated with the linear real second-order equation of Sturm-Liouville type on $[0,\pi]$ with Neumann conditions ($y'(0)=y'(\pi)=0$), where $q$ is a real-valued sign-indefinite function in $C^{1}[0,\pi]$ and $\lambda$ is a real parameter.


Global convergence of Oja's subspace algorithm for principal component extraction

Oja's principal subspace algorithm is a well-known and powerful technique for learning and tracking principal information in time series. A thorough investigation of the convergence properties of Oja's algorithm is undertaken in this paper. The asymptotic convergence rates of the algorithm are derived. The dependence of the algorithm on its initial weight matrix and the singularity of the data ...


An Exactly Solvable Model of Unsupervised Learning

A model for unsupervised learning from N-dimensional data is studied. Random training examples are drawn such that the distribution of their overlaps with a vector B ∈ ℝ^N is a mixture of two Gaussians of unit width and a separation. A student vector is generated by an online algorithm, using each example only once. The evolution of its overlap with B can be calculated exactly in the thermody...


Numerical solution of fuzzy linear Fredholm integro-differential equation by fuzzy neural network

In this paper, a novel hybrid method based on the learning algorithm of a fuzzy neural network and Newton-Cotes methods with positive coefficients for the solution of the linear Fredholm integro-differential equation of the second kind with fuzzy initial value is presented. Here the neural network is considered as part of a large field called neural computing or soft computing. We propose a learning algorithm from ...


A novel fast learning algorithm for time-delay neural networks

To counter the drawback that Waibel's time-delay neural networks (TDNNs) take up long training time in phoneme recognition, the paper puts forward several improved fast learning methods for TDNNs. Merging unsupervised Oja's rule and the similar error back-propagation algorithm for initial training of TDNN weights can effectively increase convergence speed; at the same time the error function almost m...




Journal:
  • IEEE transactions on neural networks

Volume 9, Issue 6

Pages: -

Publication date: 1998